Research Article | Open Access
Volume 2025 | Article ID 100058 | https://doi.org/10.1016/j.plaphe.2025.100058

Location-guided lesion representation learning via image generation for assessing plant leaf disease severity

Ya Yu,1,2 Xingcai Wu,1,2 Peijia Yu,1 Qiaoling Wan,1 Yujiao Dan,1 Yuanyuan Xiao,1 and Qi Wang1

1State Key Laboratory of Public Big Data, College of Computer Science and Technology, Guizhou University, Guiyang, China
2These authors contributed equally to this work.

Received 01 Dec 2024 | Accepted 20 May 2025 | Published 26 May 2025

Abstract

Accurate assessment of plant leaf disease severity is crucial for precision pesticide application, which in turn significantly enhances crop yields. Previous methods rely primarily on global perceptual learning and often misidentify non-lesion regions as lesions against complex backgrounds, compromising model accuracy. To address this background interference, we propose a location-guided lesion representation learning method (LLRL) based on image generation for assessing the severity of plant leaf diseases. Our approach comprises three key components: an image generation network (IG-Net), a location-guided lesion representation learning network (LGR-Net), and a hierarchical lesion fusion assessment network (HLFA-Net). First, IG-Net uses a diffusion model to generate diseased leaves from healthy ones, constructing the paired images required by LGR-Net. Second, LGR-Net contrasts each paired healthy and diseased image to focus the network on lesion areas, yielding a pre-trained dual-branch feature encoder (DBF-Enc) that incorporates lesion-specific prior knowledge and provides focused visual features for HLFA-Net. Third, HLFA-Net shares and freezes the DBF-Enc, then further fuses and refines the extracted features, culminating in a precise classification of disease severity. In addition, we construct an image dataset of three leaf diseases from apple, potato, and tomato plants, totaling 12,098 images, to evaluate our approach. Experimental results demonstrate that our method outperforms existing classification models, improving severity-assessment accuracy by at least 1%, underscoring the efficacy of LLRL in accurately identifying the severity of plant leaf diseases. Our code and dataset are available at http://llrl.samlab.cn/.
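The three-stage pipeline described above can be sketched in miniature as follows. This is a hedged illustration only: every class, function, feature, and threshold here (e.g. `ig_net_generate`, `DualBranchEncoder`, the mean-pixel severity rule) is a hypothetical stand-in for the authors' networks, not their published implementation.

```python
# Toy sketch of the LLRL three-stage flow; names and logic are illustrative
# assumptions, not the paper's actual code.
import random

def ig_net_generate(healthy_leaf):
    """Stage 1 (IG-Net): a diffusion model would generate a diseased
    counterpart for a healthy leaf; here we just perturb pixel values."""
    return [pixel + random.gauss(0, 0.1) for pixel in healthy_leaf]

class DualBranchEncoder:
    """DBF-Enc: encodes healthy/diseased pairs; pre-trained by LGR-Net so
    its features focus on lesion regions."""
    def __init__(self):
        self.frozen = False

    def encode(self, image):
        # Toy feature: mean and max pixel values stand in for a learned embedding.
        return (sum(image) / len(image), max(image))

def lgr_net_pretrain(encoder, pairs):
    """Stage 2 (LGR-Net): contrast each healthy/diseased pair (contrastive
    loss omitted), then freeze the encoder for reuse by HLFA-Net."""
    for healthy, diseased in pairs:
        _ = encoder.encode(healthy), encoder.encode(diseased)
    encoder.frozen = True
    return encoder

def hlfa_net_assess(encoder, image, thresholds=(0.2, 0.5)):
    """Stage 3 (HLFA-Net): fuse frozen-encoder features into a severity
    class; a simple threshold replaces the hierarchical fusion head."""
    mean_feat, _ = encoder.encode(image)
    if mean_feat < thresholds[0]:
        return "mild"
    if mean_feat < thresholds[1]:
        return "moderate"
    return "severe"

healthy = [0.05] * 16                    # toy "healthy leaf" image
diseased = ig_net_generate(healthy)      # IG-Net builds the paired image
enc = lgr_net_pretrain(DualBranchEncoder(), [(healthy, diseased)])
print(enc.frozen, hlfa_net_assess(enc, [0.8] * 16))
```

The key design point the sketch preserves is the frozen shared encoder: HLFA-Net consumes DBF-Enc features without updating them, so the lesion-focused prior learned from generated pairs is not overwritten during severity classification.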

© 2019-2023 Plant Phenomics. All Rights Reserved. ISSN 2643-6515.
